Tensor Decomposition via Simultaneous Power Iteration (Supplementary Material)
Proof. One can relate the singular values of the Hadamard product Z ∘ Z to those of the Kronecker product Z ⊗ Z. In particular, as Z ∘ Z can be obtained from Z ⊗ Z by deleting some rows and columns, Lemma A.1 tells us that σmin(Z ∘ Z) ≥ σmin(Z ⊗ Z) and σmax(Z ∘ Z) ≤ σmax(Z ⊗ Z). The lemma then follows, as the Kronecker product Z ⊗ Z is known to satisfy σmin(Z ⊗ Z) = (σmin(Z))² and σmax(Z ⊗ Z) = (σmax(Z))².
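As a quick numerical companion to the proof (not part of the original supplement), the NumPy sketch below checks the two facts the argument relies on: the extreme singular values of Z ⊗ Z are the squares of those of Z, and deleting rows or columns of Z ⊗ Z can only shrink the largest singular value. The matrix sizes and the particular submatrix chosen are arbitrary illustrations.

```python
# Illustrative check (not from the paper) of the singular-value facts used in the proof.
import numpy as np

rng = np.random.default_rng(0)
k, d = 4, 6
Z = rng.standard_normal((k, d))

s = np.linalg.svd(Z, compute_uv=False)                 # singular values of Z
s_kron = np.linalg.svd(np.kron(Z, Z), compute_uv=False)  # singular values of Z ⊗ Z

# sigma_max(Z ⊗ Z) = sigma_max(Z)^2 and sigma_min(Z ⊗ Z) = sigma_min(Z)^2
print(np.isclose(s_kron[0], s[0] ** 2))    # True
print(np.isclose(s_kron[-1], s[-1] ** 2))  # True

# Deleting rows/columns of Z ⊗ Z can only shrink the largest singular value.
sub = np.kron(Z, Z)[: k * d // 2, : d * d // 2]         # an arbitrary submatrix
print(np.linalg.svd(sub, compute_uv=False)[0] <= s_kron[0])  # True
```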
Similar Papers
Tensor Decomposition via Simultaneous Power Iteration
Tensor decomposition is an important problem with many applications across several disciplines, and a popular approach for this problem is the tensor power method. However, previous works with theoretical guarantees based on this approach can only find the top eigenvectors one by one, unlike the case for matrices. In this paper, we show how to find the eigenvectors simultaneously with the hel...
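Since the abstract above is truncated, the following is only a hedged sketch of what a simultaneous (orthogonal-iteration-style) tensor power update could look like for a symmetric, orthogonally decomposable third-order tensor. The function names, the QR re-orthogonalization step, and the fixed iteration count are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch: maintain k vectors at once and re-orthogonalize after each tensor power update,
# for a symmetric tensor T = sum_i lam_i * a_i ⊗ a_i ⊗ a_i with orthonormal a_i.
import numpy as np

def make_odeco_tensor(A, lam):
    # A: d x k with orthonormal columns, lam: length-k positive weights.
    return np.einsum("i,ai,bi,ci->abc", lam, A, A, A)

def simultaneous_tensor_power(T, k, iters=50, seed=0):
    d = T.shape[0]
    Q, _ = np.linalg.qr(np.random.default_rng(seed).standard_normal((d, k)))
    for _ in range(iters):
        # Apply the power update T(I, q_j, q_j) to every column, then re-orthogonalize.
        M = np.einsum("abc,bj,cj->aj", T, Q, Q)
        Q, _ = np.linalg.qr(M)
    lam_est = np.einsum("abc,aj,bj,cj->j", T, Q, Q, Q)  # estimated eigenvalues T(q_j, q_j, q_j)
    return Q, lam_est
```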
Guaranteed Non-Orthogonal Tensor Decomposition via Alternating Rank-1 Updates
In this paper, we provide local and global convergence guarantees for recovering CP (Candecomp/Parafac) tensor decomposition. The main step of the proposed algorithm is a simple alternating rank-1 update which is the alternating version of the tensor power iteration adapted for asymmetric tensors. Local convergence guarantees are established for third order tensors of rank k in d dimensions, wh...
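For intuition only, here is a minimal sketch of a single alternating rank-1 update for an asymmetric third-order tensor, in the general spirit described above; the exact update order, normalization, and deflation steps used in the paper are not reproduced here.

```python
# Hedged sketch: one alternating rank-1 update cycling through the three modes of an
# asymmetric tensor T ≈ lam * u ⊗ v ⊗ w, normalizing after each mode.
import numpy as np

def alternating_rank1_update(T, u, v, w):
    u = np.einsum("abc,b,c->a", T, v, w)
    u /= np.linalg.norm(u)
    v = np.einsum("abc,a,c->b", T, u, w)
    v /= np.linalg.norm(v)
    w = np.einsum("abc,a,b->c", T, u, v)
    w /= np.linalg.norm(w)
    lam = np.einsum("abc,a,b,c->", T, u, v, w)  # current rank-1 weight estimate
    return u, v, w, lam
```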
Fast and Guaranteed Tensor Decomposition via Sketching
Tensor CANDECOMP/PARAFAC (CP) decomposition has wide applications in statistical learning of latent variable models and in data mining. In this paper, we propose fast and randomized tensor CP decomposition algorithms based on sketching. We build on the idea of count sketches, but introduce many novel ideas which are unique to tensors. We develop novel methods for randomized computation of tenso...
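As background for the sketching primitive mentioned above, here is a minimal count sketch of a vector; the hash construction and the decompression estimate are textbook choices, not the tensor-specific sketches developed in the paper.

```python
# Hedged illustration: a basic count sketch of a length-n vector into m buckets.
import numpy as np

def count_sketch(x, m, seed=0):
    rng = np.random.default_rng(seed)
    h = rng.integers(0, m, size=x.shape[0])        # bucket hash
    s = rng.choice([-1.0, 1.0], size=x.shape[0])   # sign hash
    y = np.zeros(m)
    np.add.at(y, h, s * x)                         # scatter-add signed entries into buckets
    return y, h, s

def decompress(y, h, s):
    # Unbiased estimate of the original vector from its sketch.
    return s * y[h]
```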
Tensor Decompositions for Learning Latent Variable Models
This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models—including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation—which exploits a certain tensor structure in their low-order observable moments (typically, of second- and third-order). Specifically, parameter estimation is reduced to the pro...
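To make the "observable moments" concrete, the snippet below forms an empirical third-order moment tensor from samples; the whitening and correction terms that the actual method-of-moments reduction requires are omitted, so this is only a schematic building block, not the report's estimator.

```python
# Hedged illustration: empirical third-order moment tensor M3 ≈ E[x ⊗ x ⊗ x] from i.i.d. samples.
import numpy as np

def empirical_third_moment(X):
    # X: n x d matrix of samples; returns the d x d x d empirical moment tensor.
    return np.einsum("na,nb,nc->abc", X, X, X) / X.shape[0]
```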
Online and Differentially-Private Tensor Decomposition
Tensor decomposition is an important tool for big data analysis. In this paper, we resolve many of the key algorithmic questions regarding robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method which enjoy these strong properties. We present the first guarantees for online tensor power method which has a...
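Purely as a schematic, the following perturbs a symmetric tensor power update with Gaussian noise, in the spirit of noisy power iteration; the noise scale sigma is a free placeholder, not a parameter calibrated to any privacy budget, and this is not the paper's variant.

```python
# Hedged sketch: a noise-perturbed symmetric tensor power update (sigma is an uncalibrated placeholder).
import numpy as np

def noisy_tensor_power_step(T, q, sigma, rng):
    g = rng.standard_normal(q.shape)
    q_new = np.einsum("abc,b,c->a", T, q, q) + sigma * g
    return q_new / np.linalg.norm(q_new)
```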